# European language optimization
## Pleias RAG 1B
Author: PleIAs · License: Apache-2.0
Pleias-RAG-1B is a 1.2B-parameter compact reasoning model designed specifically for retrieval-augmented generation (RAG), search, and document summarization. It excels at multilingual RAG and supports structured citation generation.
Tags: Large Language Model, Transformers, Supports Multiple Languages
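Since the entry highlights RAG with structured citations, here is a minimal sketch of querying the model over retrieved passages with the Hugging Face transformers library. The Hub repository id, the numbered-source prompt layout, and the generation settings are assumptions for illustration, not the model's documented interface.

```python
# Minimal RAG-style query sketch; repository id and prompt format are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PleIAs/Pleias-RAG-1B"  # assumed Hub repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Retrieved passages are passed in as numbered sources so the answer can cite them;
# the exact source/citation convention is an assumption here.
sources = [
    "Source 1: The Treaty of Rome was signed in 1957 by six founding states.",
    "Source 2: The European Economic Community later became part of the EU.",
]
question = "When was the Treaty of Rome signed?"
prompt = "\n".join(sources) + f"\n\nQuestion: {question}\nAnswer with citations:"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```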
## Mistral 7B Instruct v0.3 GPTQ 8-bit
Author: cortecs
An 8-bit GPTQ quantization of Mistral-7B-Instruct-v0.3 developed by IST Austria, suitable for multilingual text generation tasks.
Tags: Large Language Model, Transformers
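As a rough illustration of how an 8-bit GPTQ checkpoint is typically consumed, the sketch below loads the model through transformers. The repository id is an assumption, and GPTQ weights additionally require a GPTQ backend (e.g. optimum with auto-gptq or gptqmodel) to be installed.

```python
# Sketch of loading a GPTQ-quantized checkpoint; repository id is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cortecs/Mistral-7B-Instruct-v0.3-GPTQ-8b"  # assumed Hub repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The GPTQ quantization config is read from the checkpoint itself, so a plain
# from_pretrained call is enough; device_map places the layers on available GPUs.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype=torch.float16,
)

messages = [{"role": "user", "content": "Summarize the GDPR in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```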
## Llama 3 Wissenschaft 8B
Author: nbeerbower · License: Other
A multilingual hybrid model based on Llama-3-8B that integrates German, Italian, and English capabilities.
Tags: Large Language Model, Transformers
## Mistral 7B OpenOrca Oasst Top1 2023-08-25 V3 Mistral 7B Instruct V0.1
Author: MaziyarPanahi · License: Apache-2.0
A merged model based on the Mistral-7B architecture that combines the strengths of the Mistral-7B-Instruct and OpenOrca-oasst_top1 models, specializing in multilingual text generation tasks (see the merging sketch below).
Tags: Large Language Model, Transformers, Supports Multiple Languages
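To make the "merged model" notion concrete, the sketch below shows the simplest form of weight-space merging: a plain element-wise average of two checkpoints that share the same architecture. This is a generic illustration of the technique, not the recipe behind this particular model; the repository ids are hypothetical, and real merges usually rely on dedicated tooling (e.g. mergekit) with more refined methods than a simple average.

```python
# Generic weight-averaging sketch; "org/model-a" and "org/model-b" are hypothetical ids.
import torch
from transformers import AutoModelForCausalLM

model_a = AutoModelForCausalLM.from_pretrained("org/model-a", torch_dtype=torch.float16)
model_b = AutoModelForCausalLM.from_pretrained("org/model-b", torch_dtype=torch.float16)

merged_state = {}
with torch.no_grad():
    state_b = model_b.state_dict()
    for name, tensor_a in model_a.state_dict().items():
        # Average each parameter tensor; both checkpoints must have identical shapes.
        merged_state[name] = (tensor_a + state_b[name]) / 2

model_a.load_state_dict(merged_state)   # reuse model_a as the merged model
model_a.save_pretrained("merged-model") # write the result to disk
```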